Workshop 3: Regression and Classification

This workshop covers regression and classification problems.

Part 1. Choose the Datasets

Find two datasets to work with:

  • One for a classification problem
  • One for a regression problem

Recommendation: use the datasets available in the scikit-learn library: http://scikit-learn.org/stable/datasets/

Each student should choose a different pair of datasets.


In [3]:
# Libraries required for the exercise
import numpy as np
import seaborn as sns
import matplotlib.pyplot as plt
import pandas as pd
%matplotlib inline

from sklearn import datasets, linear_model
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.ensemble import GradientBoostingRegressor
    
import statsmodels.api as sm
import statsmodels.formula.api as smf

In [4]:
# Select the Boston housing dataset for the regression problem
# (note: load_boston was deprecated in scikit-learn 1.0 and removed in 1.2)
boston = datasets.load_boston()
# Select the Iris dataset for the classification problem
iris = datasets.load_iris()

Part 2.1. Regression

Fit a linear regression on the chosen dataset.


In [5]:
# Load the dataset
boston = datasets.load_boston()
boston.DESCR   # description of the dataset
boston.keys()  # inspect the dictionary keys


Out[5]:
dict_keys(['data', 'target', 'feature_names', 'DESCR'])

In [5]:
# Convert the Boston dataset into a pandas DataFrame
newboston = pd.DataFrame(boston.data, columns=boston.feature_names)
newboston['PRICE'] = boston.target  # add the house prices
newboston.head()


Out[5]:
CRIM ZN INDUS CHAS NOX RM AGE DIS RAD TAX PTRATIO B LSTAT PRICE
0 0.00632 18.0 2.31 0.0 0.538 6.575 65.2 4.0900 1.0 296.0 15.3 396.90 4.98 24.0
1 0.02731 0.0 7.07 0.0 0.469 6.421 78.9 4.9671 2.0 242.0 17.8 396.90 9.14 21.6
2 0.02729 0.0 7.07 0.0 0.469 7.185 61.1 4.9671 2.0 242.0 17.8 392.83 4.03 34.7
3 0.03237 0.0 2.18 0.0 0.458 6.998 45.8 6.0622 3.0 222.0 18.7 394.63 2.94 33.4
4 0.06905 0.0 2.18 0.0 0.458 7.147 54.2 6.0622 3.0 222.0 18.7 396.90 5.33 36.2

In [246]:
# Fit a multiple linear regression with the house price as the response
# The model is fitted with all the features in the Boston dataset
lm = smf.ols(formula='PRICE ~ CRIM + ZN + INDUS + CHAS + NOX + RM + AGE + DIS + RAD + TAX + PTRATIO + B + LSTAT',
             data=newboston).fit()

In [224]:
fig, (ax1, ax2, ax3) = plt.subplots(ncols=3, nrows=1, sharey=True, figsize=(14, 6))
sns.regplot(x='PRICE', y='CRIM', ax=ax1, data=newboston)
sns.regplot(x='PRICE', y='ZN', ax=ax2, data=newboston)
sns.regplot(x='PRICE', y='AGE', ax=ax3, data=newboston)

fig, (ax1, ax4, ax5) = plt.subplots(ncols=3, nrows=1, sharey=True, figsize=(14, 6))
sns.regplot(x='PRICE', y='DIS', ax=ax1, data=newboston)
sns.regplot(x='PRICE', y='LSTAT', ax=ax4, data=newboston)
sns.regplot(x='PRICE', y='RM', ax=ax5, data=newboston)

fig, (ax1, ax2, ax3) = plt.subplots(ncols=3, nrows=1, sharey=True, figsize=(14, 6))
sns.regplot(x='PRICE', y='INDUS', ax=ax1, data=newboston)
sns.regplot(x='PRICE', y='PTRATIO', ax=ax2, data=newboston)
sns.regplot(x='PRICE', y='RAD', ax=ax3, data=newboston)

fig, (ax3, ax4) = plt.subplots(ncols=2, nrows=1, sharey=True, figsize=(8,5))
sns.regplot(x='PRICE', y='NOX', ax=ax3, data=newboston)
sns.regplot(x='PRICE', y='CHAS', ax=ax4, data=newboston)

fig, (ax3, ax4) = plt.subplots(ncols=2, nrows=1, sharey=True, figsize=(8, 5))
sns.regplot(x='PRICE', y='TAX', ax=ax3, data=newboston)
sns.regplot(x='PRICE', y='B', ax=ax4, data=newboston)


Out[224]:
<matplotlib.axes._subplots.AxesSubplot at 0x3ab76940>

Part 2.2. Evaluate the quality of the regression

Obtain a measure of the regression quality (e.g. R²)


In [250]:
# Summary of the fitted model, with statistics that measure its quality
lm.summary()
# The model has an R² of 0.741


Out[250]:
OLS Regression Results
Dep. Variable: PRICE R-squared: 0.741
Model: OLS Adj. R-squared: 0.734
Method: Least Squares F-statistic: 108.1
Date: Thu, 13 Apr 2017 Prob (F-statistic): 6.95e-135
Time: 17:15:01 Log-Likelihood: -1498.8
No. Observations: 506 AIC: 3026.
Df Residuals: 492 BIC: 3085.
Df Model: 13
Covariance Type: nonrobust
coef std err t P>|t| [95.0% Conf. Int.]
Intercept 36.4911 5.104 7.149 0.000 26.462 46.520
CRIM -0.1072 0.033 -3.276 0.001 -0.171 -0.043
ZN 0.0464 0.014 3.380 0.001 0.019 0.073
INDUS 0.0209 0.061 0.339 0.735 -0.100 0.142
CHAS 2.6886 0.862 3.120 0.002 0.996 4.381
NOX -17.7958 3.821 -4.658 0.000 -25.302 -10.289
RM 3.8048 0.418 9.102 0.000 2.983 4.626
AGE 0.0008 0.013 0.057 0.955 -0.025 0.027
DIS -1.4758 0.199 -7.398 0.000 -1.868 -1.084
RAD 0.3057 0.066 4.608 0.000 0.175 0.436
TAX -0.0123 0.004 -3.278 0.001 -0.020 -0.005
PTRATIO -0.9535 0.131 -7.287 0.000 -1.211 -0.696
B 0.0094 0.003 3.500 0.001 0.004 0.015
LSTAT -0.5255 0.051 -10.366 0.000 -0.625 -0.426
Omnibus: 178.029 Durbin-Watson: 1.078
Prob(Omnibus): 0.000 Jarque-Bera (JB): 782.015
Skew: 1.521 Prob(JB): 1.54e-170
Kurtosis: 8.276 Cond. No. 1.51e+04
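The R² of 0.741 above is computed on the same data used to fit the model. A more robust check scores the model on a held-out split; a minimal sketch with scikit-learn, using the bundled diabetes dataset as a stand-in (load_boston was deprecated in scikit-learn 1.0 and removed in 1.2):

```python
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

# Bundled regression dataset (stand-in for the removed load_boston)
X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

model = LinearRegression().fit(X_train, y_train)
y_pred = model.predict(X_test)

# Quality on data the model has never seen
print("train R^2:", model.score(X_train, y_train))
print("test  R^2:", r2_score(y_test, y_pred))
print("test RMSE:", np.sqrt(mean_squared_error(y_test, y_pred)))
```

The gap between train and test R² gives a first indication of overfitting that an in-sample R² alone cannot show.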

Part 3.1. Perform a classification

Perform a classification using a logistic regression


In [2]:
# Load the dataset
iris = datasets.load_iris()
iris.DESCR   # description of the dataset
iris.keys()  # inspect the dictionary keys


Out[2]:
dict_keys(['data', 'target', 'target_names', 'DESCR', 'feature_names'])

In [38]:
# Convert the Iris dataset into a pandas DataFrame
newiris = pd.DataFrame(iris.data, columns=iris.feature_names)
# Description of the data
#newiris.describe()

In [39]:
X = iris.data[:, 2:4]  # take the last two columns (petal features)
Y = iris.target  # categorical target variable
h = 0.02  # step size of the mesh used for the decision-boundary plot below
logreg = linear_model.LogisticRegression(C=2).fit(X, Y)  # fit the logistic regression model

Part 3.2. Evaluate the classification

Obtain at least two measures of classification performance (e.g. accuracy, recall)


In [40]:
# Accuracy of the model on the training data
logreg.score(X, Y)


Out[40]:
0.88666666666666671
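The score above covers only accuracy, and the exercise asks for at least two metrics. A minimal sketch computing accuracy, precision, and recall with sklearn.metrics on the same petal features (the model is refitted here so the snippet stands alone; note that newer scikit-learn solvers may give a slightly different accuracy than the 0.887 shown above):

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, precision_score, recall_score

iris = load_iris()
X = iris.data[:, 2:4]  # petal length and width, as above
Y = iris.target

logreg = LogisticRegression(C=2).fit(X, Y)
Y_pred = logreg.predict(X)

print("accuracy :", accuracy_score(Y, Y_pred))
# macro averaging weights the three iris classes equally
print("precision:", precision_score(Y, Y_pred, average='macro'))
print("recall   :", recall_score(Y, Y_pred, average='macro'))
```

For the multiclass iris problem, precision and recall are computed per class and then averaged; `average='macro'` is one reasonable choice when the classes are balanced, as they are here (50 samples each).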

In [41]:
logreg.coef_


Out[41]:
array([[-1.39284547, -2.08971797],
       [ 0.95949065, -1.81779712],
       [ 0.12305336,  3.20378251]])

In [42]:
# Visual representation of the model's predictions
# Plot the decision boundary by assigning a color to each point in the mesh
x_min, x_max = X[:, 0].min() - .5, X[:, 0].max() + .5
y_min, y_max = X[:, 1].min() - .5, X[:, 1].max() + .5
xx, yy = np.meshgrid(np.arange(x_min, x_max, h), np.arange(y_min, y_max, h))
Z = logreg.predict(np.c_[xx.ravel(), yy.ravel()])

Z = Z.reshape(xx.shape)
plt.figure(1, figsize=(6, 5))
plt.pcolormesh(xx, yy, Z, cmap=plt.cm.Paired)

plt.scatter(X[:, 0], X[:, 1], c=Y, edgecolors='k', cmap=plt.cm.Paired)
plt.xlabel('Petal length')
plt.ylabel('Petal width')
plt.title('Classification with a simple logistic regression')

plt.xlim(xx.min(), xx.max())
plt.ylim(yy.min(), yy.max())
plt.xticks(())
plt.yticks(())

plt.show()


Part 4. Other algorithms

Choose other algorithms (each student a different set), repeat exercises 2 and 3 with them, and compare their performance against the linear regression (for regression) and the logistic regression (for classification).
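For the classification side of this comparison, a hedged sketch using the GradientBoostingClassifier already imported at the top of the notebook, evaluated on a held-out split (the split ratio and `random_state` are illustrative choices, not tuned values):

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import accuracy_score, recall_score
from sklearn.model_selection import train_test_split

iris = load_iris()
X = iris.data[:, 2:4]  # same petal features as the logistic regression
Y = iris.target
X_train, X_test, Y_train, Y_test = train_test_split(
    X, Y, test_size=0.3, random_state=0, stratify=Y)

# Gradient-boosted trees with default-ish settings
gbc = GradientBoostingClassifier(n_estimators=100, random_state=0)
gbc.fit(X_train, Y_train)
Y_pred = gbc.predict(X_test)

print("test accuracy:", accuracy_score(Y_test, Y_pred))
print("test recall  :", recall_score(Y_test, Y_pred, average='macro'))
```

Comparing these held-out scores with the logistic regression's training accuracy is not entirely fair; for a clean comparison both models should be scored on the same test split.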

Linear regression

A multiple polynomial regression is fitted in order to compare its performance with the simple linear regression and with the Gradient Boosting Regression Trees (GBRT) algorithm for classification and regression.


In [11]:
Y = boston.target  # house prices (dependent variable)
X = boston.data    # independent variables
poly = PolynomialFeatures(degree=3)  # degree-3 polynomial expansion of the independent variables
X_ = poly.fit_transform(X)
clf = sm.OLS(Y, X_).fit()  # multiple polynomial regression (OLS lives in statsmodels.api, not the formula API)
clf.summary()


Out[11]:
OLS Regression Results
Dep. Variable: y R-squared: 0.998
Model: OLS Adj. R-squared: 0.962
Method: Least Squares F-statistic: 27.62
Date: Sun, 16 Apr 2017 Prob (F-statistic): 2.32e-16
Time: 22:49:11 Log-Likelihood: -292.12
No. Observations: 506 AIC: 1538.
Df Residuals: 29 BIC: 3554.
Df Model: 476
Covariance Type: nonrobust
[coefficient table elided: the degree-3 polynomial expansion of the 13 features yields 560 design-matrix columns (const plus x1–x559, i.e. C(16,3) monomials of degree ≤ 3), and nearly all coefficients are statistically insignificant, with P>|t| well above 0.05]
Omnibus: 230.862 Durbin-Watson: 1.839
Prob(Omnibus): 0.000 Jarque-Bera (JB): 12365.601
Skew: -1.176 Prob(JB): 0.00
Kurtosis: 27.103 Cond. No. 1.38e+16

The very large condition number (1.38e+16) signals severe multicollinearity among the polynomial terms, which explains the wide confidence intervals and non-significant p-values of the individual coefficients.

Next, a regression is fitted with Gradient Boosting Regression Trees (GBRT), an ensemble algorithm that can be used for both classification and regression.


In [9]:
Y = boston.target  # House prices (dependent variable)
X = boston.data    # Predictor variables
# 'ls' = least squares (renamed 'squared_error' in newer scikit-learn)
regresion = GradientBoostingRegressor(n_estimators=500, learning_rate=0.05,
                                      max_depth=1, random_state=0, loss='ls').fit(X, Y)
regresion.score(X, Y)  # R^2 on the training data


Out[9]:
0.91021166377615448

From the regression analyses performed on the Boston dataset, the polynomial regression achieved the best fit, followed by the GBRT model, with R^(2) values of 0.998 and 0.910, respectively. Note that both scores are computed on the training data, so they overstate out-of-sample performance.
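Because `score(X, Y)` above is evaluated on the same data used for training, a held-out split gives a more honest estimate. A minimal sketch; since `load_boston` was removed in scikit-learn 1.2, it falls back to `load_diabetes` when the Boston data is unavailable:

```python
from sklearn import datasets
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

try:
    X, y = datasets.load_boston(return_X_y=True)
except Exception:  # load_boston was removed in scikit-learn >= 1.2
    X, y = datasets.load_diabetes(return_X_y=True)

# Hold out 25% of the data for testing
X_train, X_test, y_train, y_test = train_test_split(X, y,
                                                    test_size=0.25, random_state=0)

gbr = GradientBoostingRegressor(n_estimators=500, learning_rate=0.05,
                                max_depth=1, random_state=0).fit(X_train, y_train)
print(gbr.score(X_train, y_train))  # R^2 on training data (optimistic)
print(gbr.score(X_test, y_test))    # R^2 on unseen data
```

The test-set R^2 will typically be lower than the 0.910 training figure reported above.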

Classification algorithm

The Gradient Boosting Trees (GBRT) classification algorithm is applied in order to compare its performance against a logistic regression.


In [34]:
X = iris.data[:, 2:4]  # Petal features (length and width)
Y = iris.target        # Class labels (categorical variable)

clasificacion = GradientBoostingClassifier(n_estimators=500, learning_rate=0.05,
                                        max_depth=1, random_state=0).fit(X, Y)
clasificacion.score(X, Y)


Out[34]:
0.97333333333333338

In [33]:
# Plot the decision boundary. For that, we will assign a color to each
# point in the mesh [x_min, x_max]x[y_min, y_max].
x_min, x_max = X[:, 0].min() - .5, X[:, 0].max() + .5
y_min, y_max = X[:, 1].min() - .5, X[:, 1].max() + .5
h = .02  # Step size of the mesh
xx, yy = np.meshgrid(np.arange(x_min, x_max, h), np.arange(y_min, y_max, h))
Z = clasificacion.predict(np.c_[xx.ravel(), yy.ravel()])
# Put the result into a color plot
Z = Z.reshape(xx.shape)
plt.figure(1, figsize=(6, 5))
plt.pcolormesh(xx, yy, Z, cmap=plt.cm.Paired)
# Plot also the training points
plt.scatter(X[:, 0], X[:, 1], c=Y, edgecolors='k', cmap=plt.cm.Paired)
plt.xlabel('Petal length')
plt.ylabel('Petal width')
plt.title('GBRT classification')

plt.xlim(xx.min(), xx.max())
plt.ylim(yy.min(), yy.max())
plt.xticks(())
plt.yticks(())

plt.show()


From the classification analyses performed on the Iris dataset, the GBRT model achieved the best performance, followed by the simple logistic regression, with accuracy scores of 0.97 and 0.88, respectively.
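The logistic-regression baseline referred to above can be refitted on the same two petal features. A minimal sketch; the exact accuracy depends on the solver and the regularization strength `C`, so the 0.88 figure is not guaranteed to be reproduced:

```python
from sklearn import datasets
from sklearn.linear_model import LogisticRegression

iris = datasets.load_iris()
X = iris.data[:, 2:4]  # Petal length and width
y = iris.target

# Multinomial logistic regression on the two petal features
logreg = LogisticRegression(max_iter=1000).fit(X, y)
print(logreg.score(X, y))  # Training accuracy
```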

Taller 4

Present the algorithms chosen in Punto 4 of Taller 3.